Non-linear Information Inequalities

Authors

  • Terence Chan
  • Alex J. Grant
Abstract

We construct non-linear information inequalities from Matúš’ infinite series of linear information inequalities. Each individual non-linear inequality is strong enough to prove that the closure of the set of all entropy functions is not polyhedral for four or more random variables, a fact previously established using the full series of linear inequalities. To the best of our knowledge, these are the first non-trivial examples of non-linear information inequalities.
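For context, the "set of all entropy functions" referred to above is the entropy region; the following are standard definitions added here as a reader's note, not spelled out in the abstract itself:

Given $n$ jointly distributed discrete random variables $X_1,\dots,X_n$, their entropy function is the vector
\[
\mathbf{h} \in \mathbb{R}^{2^n - 1}, \qquad \mathbf{h}(S) = H(X_i : i \in S), \quad \emptyset \neq S \subseteq \{1,\dots,n\},
\]
and $\Gamma^*_n$ denotes the set of all such vectors. The abstract's claim is that the closure $\overline{\Gamma^*_n}$ is not polyhedral for $n \geq 4$: no finite list of linear inequalities cuts it out, which is why either an infinite family of linear inequalities or a single non-linear inequality is needed to describe it.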


Similar Articles

On Fisher Information Inequalities and Score Functions in Non-invertible Linear Systems

In this note, we review properties of score functions and discuss inequalities on the Fisher information matrix of a random vector subjected to linear non-invertible transformations. We give alternate derivations of results previously published in [6] and provide new interpretations of the cases of equality.


A New Class of Non-shannon-type Inequalities for Entropies∗

In this paper we prove a countable set of non-Shannon-type linear information inequalities for entropies of discrete random variables, i.e., information inequalities which cannot be reduced to the “basic” inequality I(X:Y|Z) ≥ 0. Our results generalize the inequalities of Z. Zhang and R. Yeung (1998), who found the first examples of non-Shannon-type information inequalities.
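The "basic" inequality mentioned above can be checked numerically for any joint distribution. The sketch below (function names are illustrative, not from the paper) computes I(X;Y|Z) from the identity I(X;Y|Z) = H(X,Z) + H(Y,Z) − H(X,Y,Z) − H(Z) on a random joint pmf over three binary variables and confirms it is non-negative:

```python
import itertools
import math
import random

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axes):
    """Marginalize a joint pmf over (X, Y, Z) onto the given coordinate axes."""
    out = {}
    for xyz, p in joint.items():
        key = tuple(xyz[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

def conditional_mi(joint):
    """I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)."""
    h_xz = entropy(marginal(joint, (0, 2)))
    h_yz = entropy(marginal(joint, (1, 2)))
    h_xyz = entropy(joint)
    h_z = entropy(marginal(joint, (2,)))
    return h_xz + h_yz - h_xyz - h_z

# Build a random joint pmf over binary X, Y, Z.
random.seed(0)
weights = [random.random() for _ in range(8)]
total = sum(weights)
joint = {xyz: w / total
         for xyz, w in zip(itertools.product((0, 1), repeat=3), weights)}

# The basic inequality holds (up to floating-point error).
print(conditional_mi(joint) >= -1e-12)
```

A non-Shannon-type inequality, by contrast, is one that holds for all entropy functions yet is not a non-negative combination of instances of this basic inequality.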


Bell inequalities from multilinear contractions

We provide a framework for Bell inequalities which is based on multilinear contractions. The derivation of the inequalities allows for an intuitive geometric depiction and their violation within quantum mechanics can be seen as a direct consequence of non-vanishing commutators. The approach is motivated by generalizing recent work on non-linear inequalities which was based on the moduli of comp...


Non-Shannon Information Inequalities in Four Random Variables

Any unconstrained information inequality in three or fewer random variables can be written as a linear combination of instances of Shannon’s inequality I(A;B|C) ≥ 0. Such inequalities are sometimes referred to as “Shannon” inequalities. In 1998, Zhang and Yeung gave the first example of a “non-Shannon” information inequality in four variables. Their technique was to add two auxiliary variables w...


Lifting inequalities for polytopes

We present a method of lifting linear inequalities for the flag f-vector of polytopes to higher dimensions. Known inequalities that can be lifted using this technique are the non-negativity of the toric g-vector and that the simplex minimizes the cd-index. We obtain new inequalities for 6-dimensional polytopes. In the last section we present the currently best known inequalities for dimensions...




Journal:
  • Entropy

Volume 10, Issue:

Pages:

Publication date: 2008